A Generative Model for Attractor Dynamics

Authors

  • Richard S. Zemel
  • Michael C. Mozer
Abstract

Attractor networks, which map an input space to a discrete output space, are useful for pattern completion. However, designing a net to have a given set of attractors is notoriously tricky; training procedures are CPU intensive and often produce spurious attractors and ill-conditioned attractor basins. These difficulties occur because each connection in the network participates in the encoding of multiple attractors. We describe an alternative formulation of attractor networks in which the encoding of knowledge is local, not distributed. Although localist attractor networks have similar dynamics to their distributed counterparts, they are much easier to work with and interpret. We propose a statistical formulation of localist attractor net dynamics, which yields a convergence proof and a mathematical interpretation of model parameters.

Attractor networks map an input space, usually continuous, to a sparse output space composed of a discrete set of alternatives. They have a long history in neural network research and are often used for pattern completion, which involves filling in missing, noisy, or incorrect features in an input pattern. The initial state of the attractor net is typically determined by the input pattern. Over time, the state is drawn to one of a predefined set of states, the attractors. Attractor net dynamics can be described by a state trajectory (Figure 1a). An attractor net is generally implemented by a set of visible units whose activity represents the instantaneous state and, optionally, a set of hidden units that assist in the computation. Attractor dynamics arise from interactions among the units. In most formulations of attractor nets [2,3], the dynamics can be characterized by gradient descent in an energy landscape, allowing one to partition the output space into attractor basins.

Instead of homogeneous attractor basins, it is often desirable to sculpt basins that depend on the recent history of the network and the arrangement of attractors in the space. In psychological models of human cognition, for example, priming is fundamental: after the model visits an attractor, it should be faster to fall into the same attractor in the near future, i.e., the attractor basin should be broadened [1,6]. Another property of attractor nets is key to explaining behavioral data in psychological and neurobiological models: the gang effect, in which the strength of an attractor is influenced by other attractors in its neighborhood. Figure 1b illustrates the gang effect: the proximity of the two rightmost attractors creates a deeper attractor basin, so that if the input starts at the origin it will get pulled to the right.

Figure 1: (a) A two-dimensional space can be carved into three regions (dashed lines) by an attractor net. The dynamics of the net cause an input pattern (the X) to be mapped to one of the attractors (the O's). The solid line shows the temporal trajectory of the network state. (b) The actual energy landscape for a localist attractor net as a function of y, when the input is fixed at the origin and there are three attractors, W = ((-1,0), (1,0), (1, -A)), with a uniform prior. The shapes of attractor basins are influenced by the proximity of attractors to one another (the gang effect).
The origin of the space (depicted by a point) is equidistant from the attractor on the left and the attractor on the upper right, yet the origin clearly lies in the basin of the right attractors. This effect is an emergent property of the distribution of attractors, and it is the basis for interesting dynamics; it produces the mutually reinforcing or inhibitory influence of similar items in domains such as semantics [9], memory [10,12], and olfaction [4].

Training an attractor net is notoriously tricky. Training procedures are CPU intensive and often produce spurious attractors and ill-conditioned attractor basins [5,11]. Indeed, we are aware of no existing procedure that can robustly translate an arbitrary specification of an attractor landscape into a set of weights. These difficulties are due to the fact that each connection participates in the specification of multiple attractors; thus, knowledge in the net is distributed over connections.

We describe an alternative attractor network model in which knowledge is localized, hence the name localist attractor network. The model has many virtues, including: a trivial procedure for wiring up the architecture given an attractor landscape; the elimination of spurious attractors; the achievement of gang effects; a clear mathematical interpretation of the model parameters, which clarifies how the parameters control the qualitative behavior of the model (e.g., the magnitude of gang effects); and proofs of convergence and stability.

A localist attractor net consists of a set of n state units and m attractor units. Parameters associated with an attractor unit i encode the location of the attractor, denoted w_i, and its "pull" or strength, denoted π_i, which influence the shape of the attractor basin. Its activity at time t, q_i(t), reflects the normalized distance from the attractor center to the current state, y(t), weighted by the attractor strength:

$$q_i(t) = \frac{\pi_i \, g(y(t), w_i, \sigma(t))}{\sum_j \pi_j \, g(y(t), w_j, \sigma(t))}, \qquad g(y, w, \sigma) = \exp\!\left(-|y - w|^2 / 2\sigma^2\right) \tag{1}$$

Thus, the attractors form a layer of normalized radial-basis-function units.
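To make the dynamics concrete, here is a minimal Python sketch of the normalized radial-basis-function layer in equation (1). The state-update rule (moving y toward the responsibility-weighted mean of the attractor locations while σ is annealed) and the attractor coordinates are illustrative assumptions for this sketch, not details given in the excerpt above.

```python
import numpy as np


def localist_attractor_step(y, W, pi, sigma):
    """One update of a localist attractor net state.

    q is computed as in equation (1); the move toward the q-weighted
    mean of the attractor centers is an assumed rule for this sketch.
    """
    # g(y, w_i, sigma) = exp(-|y - w_i|^2 / (2 sigma^2)) for every attractor i
    sq_dist = np.sum((W - y) ** 2, axis=1)
    g = np.exp(-sq_dist / (2.0 * sigma ** 2))

    # Equation (1): strengths pi_i times g, normalized across attractors
    q = pi * g
    q = q / q.sum()

    # Assumed update: pull the state toward the attractors in proportion to q
    y_new = q @ W
    return y_new, q


# Illustrative configuration loosely echoing Figure 1b: one attractor on the
# left, two close together on the right (these coordinates are made up).
W = np.array([[-1.0, 0.0], [1.0, 0.0], [1.0, -0.4]])
pi = np.ones(len(W)) / len(W)   # uniform prior over attractor strengths
y = np.zeros(2)                 # input / initial state at the origin

sigma = 1.0
for _ in range(50):
    y, q = localist_attractor_step(y, W, pi, sigma)
    sigma = max(0.9 * sigma, 0.05)   # assumed annealing schedule for sigma(t)

print("final state:", y)          # lands near a right-hand attractor
print("responsibilities:", q)
```

With a large initial σ, the two nearby attractors on the right jointly pull the state away from the origin before it commits to a single attractor, which is the gang effect described above; the annealing schedule used here is only a placeholder.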

Similar Resources

Attractor Based Analysis of Centrally Cracked Plate Subjected to Chaotic Excitation

The presence of part-through cracks of limited length is one of the prevalent defects in plate structures. Because this type of damage has only a slight effect on the frequency response of the plates, conventional vibration-based damage assessment can be a challenging task. In this study, for the first time, a recently developed state-space method which is based on the chaotic excitation is ...

Improving Robust Pattern Recognition in Attractor Recurrent Neural Networks through the Use of Chaos-Like Dynamics

In this paper, two kinds of chaotic neural networks are proposed to evaluate the efficiency of chaotic dynamics in robust pattern recognition. The first model is designed based on natural selection theory. In this model, an attractor recurrent neural network intelligently guides the evaluation of chaotic nodes in order to obtain the best solution. In the second model, a different structure of ch...

Attractor Dynamics in Feedforward Neural Networks

We study the probabilistic generative models parameterized by feedforward neural networks. An attractor dynamics for probabilistic inference in these models is derived from a mean field approximation for large, layered sigmoidal networks. Fixed points of the dynamics correspond to solutions of the mean field equations, which relate the statistics of each unit to those of its Markov blanket. We ...

Model Based Method for Determining the Minimum Embedding Dimension from Solar Activity Chaotic Time Series

Predicting the future behavior of a chaotic time series system is a challenging area in the literature on nonlinear systems. The prediction accuracy for chaotic time series is extremely dependent on the model and the learning algorithm. On the other hand, cyclic solar activity, as one of the natural chaotic systems, has significant effects on Earth, climate, satellites, and space missions. Several m...

A wake-sleep algorithm for recurrent, spiking neural networks

We investigate a recently proposed model for cortical computation which performs relational inference. It consists of several interconnected, structurally equivalent populations of leaky integrate-and-fire (LIF) neurons, which are trained in a self-organized fashion with spike-timing-dependent plasticity (STDP). Despite its robust learning dynamics, the model is susceptible to a problem typical ...

Publication year: 1999